
NeurIPS Rebuttal for " Reliable training and estimation of variance networks "

Neural Information Processing Systems

We thank the reviewers for their constructive and fair reviews. We will first address two shared concerns and then turn to the individual reviews. This is an interesting venue for further research. We can rely on fast approximate nearest neighbor algorithms; thus, the two processes can in principle be performed in parallel.
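The rebuttal's remark about nearest neighbors refers to forming mini-batches from points that are close in input space. A minimal sketch of that idea is below; the function name and brute-force search are illustrative (at scale, an approximate nearest neighbor index would replace the sorted scan), not the authors' implementation.

```python
import math
import random

def local_minibatch(xs, batch_size, rng=random):
    """Sample a mini-batch of points that are close in input space.

    A random anchor is drawn, and the batch consists of the anchor's
    nearest neighbors. The brute-force sort here stands in for the
    fast approximate nearest neighbor search mentioned in the rebuttal.
    """
    anchor = rng.choice(xs)
    # Sort all points by Euclidean distance to the anchor; the anchor
    # itself (distance 0) is always the first element of the batch.
    by_dist = sorted(xs, key=lambda x: math.dist(x, anchor))
    return by_dist[:batch_size]
```

Because each batch is spatially localized, most variance parameters receive no gradient signal from a given batch, which is the sparse-gradient effect the paper relies on.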


Reviews: Reliable training and estimation of variance networks

Neural Information Processing Systems

Post-Rebuttal Feedback: Thank you for the feedback. I think this is a good paper that should appear in NeurIPS. This paper tackles uncertainty prediction by directly predicting the marginal mean and variance. To ensure the reliability of its uncertainty estimation, the paper presents a series of interesting techniques for training the prediction network, including location-aware mini-batching, mean-variance split training, and variance networks. With all these techniques adopted, the paper demonstrates convincing empirical results for its uncertainty estimation. Weakness: I am surprised by the amazing empirical performance and the simplicity of the method.
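The mean-variance split training mentioned in the review alternates between fitting the mean with the variance frozen and fitting the variance with the mean frozen. A toy sketch on a constant-mean, constant-variance Gaussian model is below; the function names, learning rate, and step counts are illustrative assumptions, not the paper's procedure.

```python
import math

def split_train(ys, steps=200, lr=0.1):
    """Mean-variance split training on a toy constant Gaussian model.

    The negative log-likelihood per point is
    0.5 * (log_var + (y - mu)**2 / exp(log_var)).
    Phase 1 updates only the mean (variance frozen at 1); phase 2
    updates only the log-variance (mean frozen).
    """
    mu, log_var = 0.0, 0.0
    for _ in range(steps):            # phase 1: mean only
        grad_mu = sum(-(y - mu) / math.exp(log_var) for y in ys) / len(ys)
        mu -= lr * grad_mu
    for _ in range(steps):            # phase 2: variance only
        grad_lv = sum(0.5 * (1.0 - (y - mu) ** 2 / math.exp(log_var))
                      for y in ys) / len(ys)
        log_var -= lr * grad_lv
    return mu, math.exp(log_var)
```

With the variance frozen, phase 1 reduces to ordinary least-squares fitting of the mean, so the variance cannot distort the mean estimate; phase 2 then fits the variance to the residuals of the already-fitted mean.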


Reviews: Reliable training and estimation of variance networks

Neural Information Processing Systems

The authors identify problems with estimating predictive variance using neural networks, and propose solutions to fix them. All the reviewers agreed that the paper is well-written, clearly highlighting the limitations of current methods and demonstrating that the proposed solution works better. The reviewers gave some suggestions to improve the paper, and raised some questions about computational complexity and scalability to high dimensions. I encourage the authors to take these into account when they prepare the final version.



Reliable training and estimation of variance networks

Skafte, Nicki, Jørgensen, Martin, Hauberg, Søren

Neural Information Processing Systems

We propose and investigate new complementary methodologies for estimating predictive variance networks in regression neural networks. We derive a locally aware mini-batching scheme that results in sparse robust gradients, and we show how to make unbiased weight updates to a variance network. Further, we formulate a heuristic for robustly fitting both the mean and variance networks post hoc. Finally, we take inspiration from posterior Gaussian processes and propose a network architecture with similar extrapolation properties to Gaussian processes. The proposed methodologies are complementary, and each improves upon baseline methods individually. Experimentally, we investigate the impact of predictive uncertainty on multiple datasets and tasks ranging from regression and active learning to generative modeling.
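The Gaussian-process-like extrapolation mentioned in the abstract means that predicted variance should revert to a large prior value away from the training data, rather than extrapolate arbitrarily. A toy one-dimensional sketch of that behavior is below; the blending rule, parameter names, and RBF-style weight are illustrative assumptions, not the paper's actual architecture.

```python
import math

def extrapolating_variance(x, train_xs, local_var, prior_var=1.0, length=1.0):
    """Toy variance rule with GP-like extrapolation.

    Near training points the fitted local variance dominates; far from
    the data the prediction reverts to a large prior variance, mimicking
    how a posterior Gaussian process's variance grows off the data.
    """
    d = min(abs(x - t) for t in train_xs)   # distance to nearest datum
    w = math.exp(-((d / length) ** 2))      # RBF-style blending weight
    return w * local_var + (1.0 - w) * prior_var
```

On the data (distance 0) the weight is 1 and the fitted local variance is returned; far away the weight decays to 0 and the prediction approaches the prior variance, so uncertainty grows under extrapolation.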